
Nornir MCP Server

by yhvh-chen
Nornir MCP.yml (7.48 kB)
app:
  description: ''
  icon: mechanical_arm
  icon_background: '#FFEAD5'
  mode: advanced-chat
  name: Nornir MCP
  use_icon_as_answer_icon: true
dependencies: []
kind: app
version: 0.2.0
workflow:
  conversation_variables: []
  environment_variables: []
  features:
    file_upload:
      allowed_file_extensions:
      - .JPG
      - .JPEG
      - .PNG
      - .GIF
      - .WEBP
      - .SVG
      allowed_file_types:
      - image
      allowed_file_upload_methods:
      - local_file
      - remote_url
      enabled: false
      fileUploadConfig:
        audio_file_size_limit: 50
        batch_count_limit: 5
        file_size_limit: 15
        image_file_size_limit: 10
        video_file_size_limit: 100
        workflow_file_upload_limit: 10
      image:
        enabled: false
        number_limits: 3
        transfer_methods:
        - local_file
        - remote_url
      number_limits: 3
    opening_statement: ''
    retriever_resource:
      enabled: true
    sensitive_word_avoidance:
      enabled: false
    speech_to_text:
      enabled: false
    suggested_questions: []
    suggested_questions_after_answer:
      enabled: false
    text_to_speech:
      enabled: false
      language: ''
      voice: ''
  graph:
    edges:
    - data:
        isInIteration: false
        isInLoop: false
        sourceType: start
        targetType: agent
      id: 1746327259651--1746347771846-target
      source: '1746327259651'
      sourceHandle: source
      target: '1746347771846'
      targetHandle: target
      type: custom
      zIndex: 0
    - data:
        isInIteration: false
        isInLoop: false
        sourceType: agent
        targetType: answer
      id: 1746347771846--1746349226468-target
      source: '1746347771846'
      sourceHandle: source
      target: '1746349226468'
      targetHandle: target
      type: custom
      zIndex: 0
    nodes:
    - data:
        desc: ''
        selected: false
        title: 开始
        type: start
        variables: []
      height: 53
      id: '1746327259651'
      position:
        x: 80
        y: 282
      positionAbsolute:
        x: 80
        y: 282
      selected: true
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 243
    - data:
        agent_parameters:
          instruction:
            type: constant
            value: You are a network automation system engineer. Use the following tools to query the system status, then answer users' questions using your expert knowledge.
          maximum_iterations:
            type: constant
            value: 10
          mcp_servers_config:
            type: constant
            value: "{\n  \"server_name1\": {\n    \"transport\": \"sse\",\n    \"url\": \"http://192.168.100.50:8002/sse\"\n  }\n}"
          model:
            type: constant
            value:
              completion_params: {}
              mode: chat
              model: gemma3:12b-it-q8_0
              model_type: llm
              provider: langgenius/ollama/ollama
              type: model-selector
          query:
            type: constant
            value: '{{#sys.query#}}'
          tools:
            type: constant
            value:
            - enabled: true
              extra:
                description: Retrieve the list of MCP server tools.
              parameters: {}
              provider_name: junjiem/mcp_sse/mcp_sse
              schemas: []
              settings: {}
              tool_description: Retrieve the list of MCP server tools.
              tool_label: Get MCP tools list
              tool_name: mcp_sse_list_tools
              type: builtin
            - enabled: true
              extra:
                description: 调用 MCP 服务端工具。
              parameters:
                arguments:
                  auto: 1
                  value: null
                tool_name:
                  auto: 1
                  value: null
              provider_name: junjiem/mcp_sse/mcp_sse
              schemas:
              - auto_generate: null
                default: null
                form: llm
                human_description:
                  en_US: Name of the tool to execute.
                  ja_JP: Name of the tool to execute.
                  pt_BR: Name of the tool to execute.
                  zh_Hans: Name of the tool to execute.
                label:
                  en_US: Tool Name
                  ja_JP: Tool Name
                  pt_BR: Tool Name
                  zh_Hans: Tool Name
                llm_description: Name of the tool to execute.
                max: null
                min: null
                name: tool_name
                options: []
                placeholder: null
                precision: null
                required: true
                scope: null
                template: null
                type: string
              - auto_generate: null
                default: null
                form: llm
                human_description:
                  en_US: Tool arguments (JSON string in the python dict[str, Any] format).
                  ja_JP: Tool arguments (JSON string in the python dict[str, Any] format).
                  pt_BR: Tool arguments (JSON string in the python dict[str, Any] format).
                  zh_Hans: Tool arguments.
                label:
                  en_US: Arguments
                  ja_JP: Arguments
                  pt_BR: Arguments
                  zh_Hans: 参数
                llm_description: Tool arguments (JSON string in the python dict[str, Any] format).
                max: null
                min: null
                name: arguments
                options: []
                placeholder: null
                precision: null
                required: true
                scope: null
                template: null
                type: string
              settings: {}
              tool_description: Call an MCP server tool.
              tool_label: Call MCP tool
              tool_name: mcp_sse_call_tool
              type: builtin
        agent_strategy_label: ReAct (Support MCP Tools)
        agent_strategy_name: mcp_sse_ReAct
        agent_strategy_provider_name: junjiem/mcp_see_agent/mcp_see_agent
        desc: ''
        memory:
          query_prompt_template: '{{#sys.query#}}'
          window:
            enabled: true
            size: 50
        output_schema: null
        plugin_unique_identifier: junjiem/mcp_see_agent:0.1.4@51e36edcd3f097a42b7fdcc84c92981a0fc19f698f4095b070bc2ddd192c864c
        selected: false
        title: Nornir-Agent
        type: agent
      height: 197
      id: '1746347771846'
      position:
        x: 383
        y: 282
      positionAbsolute:
        x: 383
        y: 282
      selected: false
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 243
    - data:
        answer: '{{#1746347771846.text#}}'
        desc: ''
        selected: false
        title: 直接回复
        type: answer
        variables: []
      height: 103
      id: '1746349226468'
      position:
        x: 686
        y: 282
      positionAbsolute:
        x: 686
        y: 282
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 243
    viewport:
      x: -95.5
      y: -48.5
      zoom: 1
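
The agent node in this workflow reaches the Nornir MCP server over SSE (the mcp_servers_config value points at http://192.168.100.50:8002/sse) and uses two builtin tools from the junjiem/mcp_sse plugin: mcp_sse_list_tools to discover what the server exposes and mcp_sse_call_tool to invoke a specific tool. The same endpoint can also be exercised outside Dify. The sketch below is a minimal example, assuming the official MCP Python SDK (the mcp package) is installed and the server is reachable at that address; it only lists the exposed tools.

# Minimal sketch: list the tools exposed by the Nornir MCP server over SSE.
# Assumes the `mcp` Python SDK is installed and that the server configured in
# the workflow above is reachable at http://192.168.100.50:8002/sse.
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main() -> None:
    async with sse_client("http://192.168.100.50:8002/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")


if __name__ == "__main__":
    asyncio.run(main())

A tool from that listing can then be invoked with session.call_tool(name, arguments), which is what the workflow's mcp_sse_call_tool step does on the agent's behalf.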

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/yhvh-chen/nornir_mcp'
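
The same lookup can be scripted. Below is a minimal Python sketch, assuming the requests package is installed and that the endpoint returns JSON metadata for the server.

# Hedged sketch: fetch this server's directory entry and print the metadata.
import requests

URL = "https://glama.ai/api/mcp/v1/servers/yhvh-chen/nornir_mcp"

response = requests.get(URL, timeout=10)
response.raise_for_status()
print(response.json())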

If you have feedback or need assistance with the MCP directory API, please join our Discord server.